Locally Recurrent Networks with Multiple Time-Scales

Authors

  • Jui-Kuo Juan
  • John G. Harris
  • Jose C. Principe
Abstract

Jui-Kuo Juan, John G. Harris and Jose C. Principe
Computational Neuro-Engineering Laboratory, University of Florida, 453 NEB Bldg, Gainesville, FL 32611
Email: [email protected]

We introduce a new generalized feed-forward structure that provides for multiple time scales. The gamma, Laguerre and other locally recurrent feed-forward structures perform poorly in cases where widely varying time constants are required. By exponentially varying the time constant along the delay line, a single delay line is able to represent signals that include various time scales. We demonstrate both discrete- and continuous-time versions of this multiple time-scale structure, which we call the multi-scale gamma filter. The multi-scale gamma has a very natural implementation in sub-threshold CMOS, and measured impulse responses from a continuous-time analog VLSI chip are shown.
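As a concrete illustration of the idea described in the abstract, the following is a minimal discrete-time sketch of a multi-scale gamma memory. It assumes the standard first-order gamma stage recursion s_k[n] = (1 - mu_k) s_k[n-1] + mu_k s_{k-1}[n] with exponentially spaced feedback gains; the parameter names (mu0, r) and the uniform output weights are illustrative choices, not values taken from the paper.

```python
import numpy as np

def multiscale_gamma_filter(x, weights, mu0=0.5, r=0.7):
    """Discrete-time multi-scale gamma filter (illustrative sketch).

    Tap k is a leaky integrator
        s_k[n] = (1 - mu_k) * s_k[n-1] + mu_k * s_{k-1}[n],  s_0[n] = x[n],
    with mu_k = mu0 * r**k, so the effective time constant grows
    exponentially along the delay line instead of being fixed as in
    the plain gamma filter.
    """
    K = len(weights)
    mus = mu0 * r ** np.arange(K)        # exponentially spaced gains, all in (0, 1)
    s = np.zeros(K)                      # tap states
    y = np.zeros(len(x))
    for n, u in enumerate(x):
        prev = u                         # s_0[n] is the current input sample
        for k in range(K):
            s[k] = (1.0 - mus[k]) * s[k] + mus[k] * prev
            prev = s[k]
        y[n] = weights @ s               # output: weighted sum of tap signals
    return y

# Example: impulse response of a 5-tap multi-scale gamma memory
impulse = np.zeros(200)
impulse[0] = 1.0
h = multiscale_gamma_filter(impulse, weights=np.ones(5))
```

Shrinking mu_k geometrically makes successive taps respond on exponentially longer time scales, which is how a single short delay line can span both fast and slow components of the input.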


Related articles

The importance of the optimal volume in the treatment of locally recurrent nasopharyngeal carcinoma; brachytherapy or stereotactic radiotherapy?

Introduction: Nasopharyngeal carcinoma (NPC) is commonly known as a radiosensitive tumor with a good initial response to radiation. Despite the improved outcome in locoregional control brought by combined treatment, modern radiotherapy techniques and enhanced imaging studies, local recurrence after primary treatment occurs at rates ranging from 15-58% at 5 years, stil...


Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays

In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...



Causal Back Propagation through Time for Locally Recurrent Neural Networks

This paper concerns dynamic neural networks for signal processing: architectural issues are considered, but the paper focuses on learning algorithms that work on-line. Locally recurrent neural networks, namely MLPs with IIR synapses and generalizations of Local Feedback MultiLayered Networks (LF MLN), are compared to more traditional neural networks, i.e. static MLPs with input and/or output buffer...


Hypersurfaces of a Sasakian space form with recurrent shape operator

Let $(M^{2n},g)$ be a real hypersurface with recurrent shape operator, tangent to the structure vector field $\xi$ of the Sasakian space form $\widetilde{M}(c)$. We show that if the shape operator $A$ of $M$ is recurrent, then it is parallel. Moreover, we show that $M$ is locally a product of two constant $\phi$-sectional curvature spaces.


Stability criteria for three-layer locally recurrent networks

The paper deals with a discrete-time recurrent neural network designed with dynamic neuron models. Dynamics are reproduced within each single neuron, hence the considered network is locally recurrent and globally feed-forward. In the paper, conditions for global stability of the considered neural network are derived using Lyapunov's second method.
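To make the term "locally recurrent, globally feed-forward" concrete, here is a minimal sketch of a dynamic neuron: a unit whose weighted input passes through a small internal IIR filter before a static nonlinearity, so all feedback stays inside the neuron while the units themselves are wired purely feed-forward. The class name, filter orders and coefficients below are illustrative assumptions, not the construction analysed in the cited paper.

```python
import numpy as np

class DynamicNeuron:
    """Unit with an internal IIR filter (local recurrence) followed by a
    static nonlinearity.  Illustrative only: filter order and coefficient
    values are arbitrary choices, not taken from the paper."""

    def __init__(self, w, b=(1.0, 0.5), a=(0.3,)):
        self.w = np.asarray(w)                 # static input weights
        self.b = np.asarray(b)                 # feed-forward (MA) coefficients
        self.a = np.asarray(a)                 # feedback (AR) coefficients: the local recurrence
        self.x_hist = np.zeros(len(self.b))    # recent weighted inputs
        self.y_hist = np.zeros(len(self.a))    # recent filter outputs

    def step(self, u):
        s = self.w @ np.asarray(u)             # weighted sum of inputs
        self.x_hist = np.roll(self.x_hist, 1)
        self.x_hist[0] = s
        v = self.b @ self.x_hist + self.a @ self.y_hist   # internal IIR filter
        self.y_hist = np.roll(self.y_hist, 1)
        self.y_hist[0] = v
        return np.tanh(v)                      # static nonlinearity

# Two dynamic neurons wired feed-forward: recurrence exists only inside each unit.
n1, n2 = DynamicNeuron(w=[0.8, -0.4]), DynamicNeuron(w=[1.2])
for t in range(5):
    u = [np.sin(0.1 * t), np.cos(0.1 * t)]
    out = n2.step([n1.step(u)])
```

Confining all feedback to each unit's internal filter is precisely what makes the overall network "globally feed-forward" despite its memory.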



Journal title:

Volume   Issue

Pages  -

Publication date: 2004